Unsupervised feature selection based on kernel fisher discriminant analysis and regression learning


Similar articles

Kernel discriminant analysis based feature selection

For two-class problems we propose two feature selection criteria based on kernel discriminant analysis (KDA). The first one is the objective function of kernel discriminant analysis called the KDA criterion. We show that the KDA criterion is monotonic for the deletion of features, which ensures stable feature selection. The second one is the recognition rate obtained by a KDA classifier, called...
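The backward-deletion scheme this abstract describes can be illustrated with a short numpy sketch. As a hedged stand-in for the kernel criterion, it scores candidate deletions with the linear two-class Fisher ratio J(S) = (m1 - m0)^T S_w^{-1} (m1 - m0); the function names and the `reg` default are illustrative, not from the paper:

```python
import numpy as np

def fisher_criterion(X, y, reg=1e-6):
    # Two-class Fisher ratio: between-class over within-class scatter.
    # Assumes labels are coded 0/1.
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter (np.cov uses ddof=1, so scale by n-1).
    Sw = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    Sw = np.atleast_2d(Sw) + reg * np.eye(X.shape[1])
    d = m1 - m0
    return float(d @ np.linalg.solve(Sw, d))

def backward_select(X, y, n_keep):
    """Greedy backward deletion: at each step remove the feature whose
    removal keeps the criterion highest (a linear stand-in for the
    kernel criterion described in the abstract)."""
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        scores = [fisher_criterion(X[:, [f for f in active if f != g]], y)
                  for g in active]
        active.pop(int(np.argmax(scores)))
    return active
```

With an informative feature and several noise features, the informative one survives deletion because removing it would collapse the criterion.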

Full text

Discriminant Analysis for Unsupervised Feature Selection

Feature selection has been proven to be efficient in preparing high dimensional data for data mining and machine learning. As most data is unlabeled, unsupervised feature selection has attracted more and more attention in recent years. Discriminant analysis has been proven to be a powerful technique to select discriminative features for supervised feature selection. To apply discriminant analys...

Full text

Robust Kernel Fisher Discriminant Analysis

Kernel methods have become standard tools for solving classification and regression problems in statistics. An example of a kernel-based classification method is Kernel Fisher discriminant analysis (KFDA), a kernel-based extension of linear discriminant analysis (LDA), which was proposed by Mika et al. (1999). As in the case of LDA, the classification performance of KFDA deteriorates in the pre...
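The two-class KFDA formulation mentioned here can be sketched in its dual form with numpy. This is a minimal sketch following the general shape of Mika et al.'s method; the RBF kernel choice and the `gamma`/`reg` defaults are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfda_fit(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant in dual form.

    Returns dual coefficients alpha and a threshold b so that a point x
    projects to sum_i alpha_i k(x_i, x) + b.
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    classes = np.unique(y)
    # Class-mean vectors in the kernel-induced feature space (dual form).
    M = [K[:, y == c].mean(axis=1) for c in classes]
    # Within-class scatter in dual form: N = sum_c K_c (I - 1/n_c) K_c^T.
    N = np.zeros((n, n))
    for c in classes:
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    N += reg * np.eye(n)  # ridge regularisation for numerical stability
    alpha = np.linalg.solve(N, M[1] - M[0])
    # Threshold halfway between the projected class means.
    b = -0.5 * (alpha @ M[0] + alpha @ M[1])
    return alpha, b

def kfda_project(X_train, X, alpha, b, gamma=1.0):
    # Project new points onto the learned discriminant direction.
    return rbf_kernel(X, X_train, gamma) @ alpha + b
```

Thresholding the projection at zero then separates the two classes; the regulariser `reg` plays the role of the stabilisation that robust variants of KFDA refine.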

Full text

Fault Diagnosis Based on Improved Kernel Fisher Discriminant Analysis

There are two fundamental problems with Kernel Fisher Discriminant Analysis (KFDA) for nonlinear fault diagnosis. The first is that the classification performance of KFDA between normal data and fault data degenerates when overlapping samples exist. The second is that the computational cost of the kernel matrix becomes large as the number of training samples increases. Aiming at the tw...

Full text

Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination

Variable selection serves a dual purpose in statistical classification problems: it enables one to identify the input variables which separate the groups well, and a classification rule based on these variables frequently has a lower error rate than the rule based on all the input variables. Kernel Fisher discriminant analysis (KFDA) is a recently proposed powerful classification procedure, fre...
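The recursive feature elimination loop this abstract refers to can be sketched with a simple ranking rule: fit a scorer, drop the lowest-ranked feature, refit. As a hedged sketch, the version below ranks by the absolute weights of a least-squares linear scorer rather than the KFDA-based ranking of the paper; the function name and signed-target coding are illustrative:

```python
import numpy as np

def rfe_linear(X, y, n_keep):
    """Recursive feature elimination sketch: repeatedly fit a
    least-squares linear scorer on the active features and drop the
    feature with the smallest absolute weight."""
    active = list(range(X.shape[1]))
    t = np.where(y == y[0], 1.0, -1.0)  # signed targets for two classes
    while len(active) > n_keep:
        Xa = X[:, active]
        w, *_ = np.linalg.lstsq(Xa, t, rcond=None)
        active.pop(int(np.argmin(np.abs(w))))
    return active
```

Unlike the backward-deletion scheme that re-scores every candidate subset, RFE eliminates one feature per refit, which is cheaper when the feature count is large.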

Full text


Journal

Journal title: Machine Learning

Year: 2018

ISSN: 0885-6125,1573-0565

DOI: 10.1007/s10994-018-5765-6